Combinatorial mutations and block diagonal polytopes

Authors

Abstract

Matching fields were introduced by Sturmfels and Zelevinsky to study certain Newton polytopes, and more recently they have been shown to give rise to toric degenerations of various families of varieties. Whenever a matching field gives rise to a toric degeneration, the associated polytope of the toric variety coincides with the matching field polytope. We study combinatorial mutations, which are analogues of cluster mutations for polytopes, and show that the property of giving rise to a toric degeneration of the Grassmannians is preserved by mutation. Moreover, the polytopes arising through mutations are Newton–Okounkov bodies for the Grassmannians with respect to certain full-rank valuations. We produce a large family of such polytopes, extending the family of so-called block diagonal matching fields.


Similar articles

Block Diagonal Majorization on $C_{0}$

Let $\mathbf{c}_0$ be the real vector space of all real sequences which converge to zero. For every $x, y \in \mathbf{c}_0$, it is said that $y$ is block diagonal majorized by $x$ (written $y \prec_b x$) if there exists a block diagonal row stochastic matrix $R$ such that $y = Rx$. In this paper we find the possible structure of linear functions $T:\mathbf{c}_0 \rightarrow \mathbf{c}_0$ preserving $\prec_b$.
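The definition above is concrete enough to sketch in code. The following is an illustrative sketch, not taken from the paper: it represents a block diagonal row stochastic matrix $R$ by its list of diagonal blocks, applies it to a finite vector (a truncated element of $\mathbf{c}_0$), and checks that each row is nonnegative and sums to one. The function name is hypothetical.

```python
def apply_block_diagonal_stochastic(blocks, x):
    """Apply a block diagonal row-stochastic matrix R, given as a list of
    square blocks (each a list of rows), to the vector x, returning y = Rx."""
    y = []
    offset = 0
    for block in blocks:
        n = len(block)
        segment = x[offset:offset + n]
        for row in block:
            # Row-stochastic condition: nonnegative entries summing to 1.
            assert all(r >= 0 for r in row), "entries must be nonnegative"
            assert abs(sum(row) - 1.0) < 1e-12, "each row must sum to 1"
            y.append(sum(r * v for r, v in zip(row, segment)))
        offset += n
    return y

# Two diagonal blocks: a 2x2 averaging block and a 1x1 identity block.
blocks = [[[0.5, 0.5], [0.25, 0.75]], [[1.0]]]
x = [4.0, 0.0, 3.0]
y = apply_block_diagonal_stochastic(blocks, x)
# Since y = Rx for this block diagonal row stochastic R, y is block
# diagonal majorized by x, i.e. y prec_b x.
```

Here `y` works out to `[2.0, 1.0, 3.0]`; each block of `y` is a convex mixture of the corresponding block of `x`, which is the intuition behind the relation $\prec_b$.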


Block Diagonal Matrices

For simplicity, we adopt the following rules: i, j, m, n, k denote natural numbers, x denotes a set, K denotes a field, a, a1, a2 denote elements of K, D denotes a non empty set, d, d1, d2 denote elements of D, M, M1, M2 denote matrices over D, A, A1, A2, B1, B2 denote matrices over K, and f, g denote finite sequences of elements of N. One can prove the following propositions: (1) Let K be a ...


Block Diagonal Natural Evolution Strategies

The Natural Evolution Strategies (NES) family of search algorithms has been shown to contain efficient black-box optimizers, but the most powerful version, xNES, does not scale to problems with more than a few hundred dimensions, while the scalable variant, SNES, potentially ignores important correlations between parameters. This paper introduces Block Diagonal NES (BD-NES), a variant of NES which uses...


Block-diagonal Hessian-free Optimization

Second-order methods for neural network optimization have several advantages over methods based on first-order gradient descent, including better scaling to large mini-batch sizes and fewer updates needed for convergence. But they are rarely applied to deep learning in practice because of high computational cost and the need for model-dependent algorithmic variations. We introduce a variant of ...



Journal

Journal title: Collectanea Mathematica

Year: 2021

ISSN: 2038-4815, 0010-0757

DOI: https://doi.org/10.1007/s13348-021-00321-w